Yumi's Blog

Part 6 Object Detection with YOLOv2 using VOC 2012 data - inference on image

In [1]:
import matplotlib.pyplot as plt
import numpy as np
import os, sys
print(sys.version)
%matplotlib inline
3.6.3 |Anaconda, Inc.| (default, Oct  6 2017, 12:04:38) 
[GCC 4.2.1 Compatible Clang 4.0.1 (tags/RELEASE_401/final)]

Read in the hyperparameters to define the YOLOv2 model used during training

In [2]:
train_image_folder = "../ObjectDetectionRCNN/VOCdevkit/VOC2012/JPEGImages/"
train_annot_folder = "../ObjectDetectionRCNN/VOCdevkit/VOC2012/Annotations/"

LABELS = ['aeroplane',  'bicycle', 'bird',  'boat',      'bottle', 
          'bus',        'car',      'cat',  'chair',     'cow',
          'diningtable','dog',    'horse',  'motorbike', 'person',
          'pottedplant','sheep',  'sofa',   'train',   'tvmonitor']

ANCHORS = np.array([1.07709888,  1.78171903,  # anchor box 1, width , height
                    2.71054693,  5.12469308,  # anchor box 2, width,  height
                   10.47181473, 10.09646365,  # anchor box 3, width,  height
                    5.48531347,  8.11011331]) # anchor box 4, width,  height


BOX               = int(len(ANCHORS)/2)
TRUE_BOX_BUFFER   = 50
IMAGE_H, IMAGE_W  = 416, 416
GRID_H,  GRID_W   = 13 , 13
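With these settings, the network predicts GRID_H x GRID_W x BOX boxes per image, each encoded as 4 coordinates, 1 confidence score, and one probability per class. A quick sanity check of the resulting shapes (a standalone sketch using the constants above):

```python
GRID_H, GRID_W = 13, 13
BOX = 4            # number of anchor boxes (len(ANCHORS) // 2)
CLASS = 20         # number of VOC labels

n_boxes = GRID_H * GRID_W * BOX    # boxes predicted per image
n_features = 4 + 1 + CLASS         # x, y, w, h, confidence, class scores

print(n_boxes)      # 676
print(n_features)   # 25
```

These numbers match the model output shape (1, 13, 13, 4, 25) seen later in this notebook.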

Load the weights trained in Part 5

In [3]:
from backend import define_YOLOv2

CLASS             = len(LABELS)
model, _          = define_YOLOv2(IMAGE_H,IMAGE_W,GRID_H,GRID_W,TRUE_BOX_BUFFER,BOX,CLASS, 
                                  trainable=False)
model.load_weights("weights_yumi.h5")
/Users/yumikondo/anaconda3/lib/python3.6/site-packages/h5py/__init__.py:34: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
Using TensorFlow backend.

Perform detection on a sample image

Encode the image using the ImageReader class

The ImageReader class was created in Part 2.
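ImageReader comes from the backend module built in Part 2; conceptually, it resizes an image to IMAGE_H x IMAGE_W and applies the norm function. As a rough illustration only, here is a pure-NumPy stand-in (encode_image is a hypothetical name, and it uses nearest-neighbor resizing, while the real class relies on OpenCV):

```python
import numpy as np

def encode_image(image, target_h=416, target_w=416, norm=lambda im: im / 255.0):
    # nearest-neighbor resize followed by normalization (illustrative only)
    h, w, _ = image.shape
    rows = np.arange(target_h) * h // target_h
    cols = np.arange(target_w) * w // target_w
    resized = image[rows[:, None], cols[None, :], :]
    return norm(resized.astype(np.float64))

# a fake 375x500 RGB image standing in for a VOC JPEG
img = np.random.randint(0, 256, size=(375, 500, 3), dtype=np.uint8)
out = encode_image(img)
print(out.shape)   # (416, 416, 3)
```

The key point is only the contract: whatever the input size, the encoder returns a (416, 416, 3) array with values scaled to [0, 1].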

In [4]:
## input encoding
from backend import ImageReader
imageReader = ImageReader(IMAGE_H,IMAGE_W=IMAGE_W, norm=lambda image : image / 255.)
out = imageReader.fit(train_image_folder + "/2007_005430.jpg")

Predict the bounding boxes.

In [5]:
print(out.shape)
X_test = np.expand_dims(out,0)
print(X_test.shape)
# the model's second input (true boxes) is only used during training; pass zeros at inference
dummy_array = np.zeros((1,1,1,1,TRUE_BOX_BUFFER,4))
y_pred = model.predict([X_test,dummy_array])
print(y_pred.shape)
(416, 416, 3)
(1, 416, 416, 3)
(1, 13, 13, 4, 25)

Rescale the network output

Recall that y_pred can take any real value. We therefore rescale the raw network output into bounding-box coordinates, confidences, and class probabilities.
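The transforms applied by OutputRescaler.fit below mirror the YOLOv2 decoding: x and y go through a sigmoid and are shifted by the grid-cell index, w and h are exponentiated and combined with the anchor dimensions, and the confidence goes through a sigmoid. A toy single-box sketch of the same arithmetic (the raw values tx, ty, tw, th, tc are arbitrary, chosen for illustration):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# raw network outputs for one box (arbitrary illustrative values)
tx, ty, tw, th, tc = 0.2, -0.5, 0.1, 0.3, 1.5
grid_x, grid_y = 6, 4              # indices of the grid cell owning this box
anchor_w, anchor_h = 2.71, 5.12    # one of the anchor box shapes
GRID_W, GRID_H = 13, 13

# same transforms as in OutputRescaler.fit
x = (sigmoid(tx) + grid_x) / GRID_W    # center x, rescaled to [0, 1]
y = (sigmoid(ty) + grid_y) / GRID_H    # center y, rescaled to [0, 1]
w = (np.exp(tw) + anchor_w) / GRID_W   # width  in image units
h = (np.exp(th) + anchor_h) / GRID_H   # height in image units
conf = sigmoid(tc)                     # confidence, squashed into (0, 1)

assert 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0
assert 0.0 < conf < 1.0
```

Because the sigmoid bounds its output to (0, 1) and the grid index is at most GRID_W - 1, the decoded centers are guaranteed to stay inside the image.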

In [6]:
class OutputRescaler(object):
    def __init__(self,ANCHORS):
        self.ANCHORS = ANCHORS

    def _sigmoid(self, x):
        return 1. / (1. + np.exp(-x))
    def _softmax(self, x, axis=-1, t=-100.):
        x = x - np.max(x)

        if np.min(x) < t:
            x = x/np.min(x)*t

        e_x = np.exp(x)
        return e_x / e_x.sum(axis, keepdims=True)
    def get_shifting_matrix(self,netout):
        
        GRID_H, GRID_W, BOX = netout.shape[:3]
        no = netout[...,0]
        
        ANCHORSw = self.ANCHORS[::2]
        ANCHORSh = self.ANCHORS[1::2]
       
        mat_GRID_W = np.zeros_like(no)
        for igrid_w in range(GRID_W):
            mat_GRID_W[:,igrid_w,:] = igrid_w

        mat_GRID_H = np.zeros_like(no)
        for igrid_h in range(GRID_H):
            mat_GRID_H[igrid_h,:,:] = igrid_h

        mat_ANCHOR_W = np.zeros_like(no)
        for ianchor in range(BOX):    
            mat_ANCHOR_W[:,:,ianchor] = ANCHORSw[ianchor]

        mat_ANCHOR_H = np.zeros_like(no) 
        for ianchor in range(BOX):    
            mat_ANCHOR_H[:,:,ianchor] = ANCHORSh[ianchor]
        return(mat_GRID_W,mat_GRID_H,mat_ANCHOR_W,mat_ANCHOR_H)

    def fit(self, netout):    
        '''
        netout  : np.array of shape (N grid h, N grid w, N anchor, 4 + 1 + N class)
        
        a single image output of model.predict()
        '''
        GRID_H, GRID_W, BOX = netout.shape[:3]
        
        (mat_GRID_W,
         mat_GRID_H,
         mat_ANCHOR_W,
         mat_ANCHOR_H) = self.get_shifting_matrix(netout)


        # bounding box parameters
        netout[..., 0]   = (self._sigmoid(netout[..., 0]) + mat_GRID_W)/GRID_W # x      unit: range between 0 and 1
        netout[..., 1]   = (self._sigmoid(netout[..., 1]) + mat_GRID_H)/GRID_H # y      unit: range between 0 and 1
        netout[..., 2]   = (np.exp(netout[..., 2]) + mat_ANCHOR_W)/GRID_W      # width  unit: range between 0 and 1
        netout[..., 3]   = (np.exp(netout[..., 3]) + mat_ANCHOR_H)/GRID_H      # height unit: range between 0 and 1
        # rescale the confidence to range 0 and 1 
        netout[..., 4]   = self._sigmoid(netout[..., 4])
        expand_conf      = np.expand_dims(netout[...,4],-1) # (N grid h , N grid w, N anchor , 1)
        # rescale the class probability to range between 0 and 1
        # Pr(object class = k) = Pr(object exists) * Pr(object class = k |object exists)
        #                      = Conf * P^c
        netout[..., 5:]  = expand_conf * self._softmax(netout[..., 5:])
        # ignore the class probability if it is less than obj_threshold 
    
        return(netout)

Experiment with OutputRescaler

In [7]:
netout         = y_pred[0]
outputRescaler = OutputRescaler(ANCHORS=ANCHORS)
netout_scale   = outputRescaler.fit(netout)

Post-processing the YOLOv2 output

YOLOv2 can potentially produce GRID_H x GRID_W x BOX bounding boxes. However, only a few of them contain actual objects, and several boxes may cover the same object. I will postprocess the predicted bounding boxes.

In [8]:
from backend import BoundBox

    
def find_high_class_probability_bbox(netout_scale, obj_threshold):
    '''
    == Input == 
    netout : y_pred[i] np.array of shape (GRID_H, GRID_W, BOX, 4 + 1 + N class)
    
             x, w must be a unit of image width
             y, h must be a unit of image height
             c must be in between 0 and 1
             p^c must be in between 0 and 1
    == Output ==
    
    boxes  : list of bounding boxes whose highest class score exceeds obj_threshold
    
             
    '''
    GRID_H, GRID_W, BOX = netout_scale.shape[:3]
    
    boxes = []
    for row in range(GRID_H):
        for col in range(GRID_W):
            for b in range(BOX):
                # elements from index 5 onward hold the (confidence-weighted) class probabilities
                classes = netout_scale[row,col,b,5:]
                
                if np.sum(classes) > 0:
                    # first 4 elements are x, y, w, and h
                    x, y, w, h = netout_scale[row,col,b,:4]
                    confidence = netout_scale[row,col,b,4]
                    box = BoundBox(x-w/2, y-h/2, x+w/2, y+h/2, confidence, classes)
                    if box.get_score() > obj_threshold:
                        boxes.append(box)
    return(boxes)

Experiment with find_high_class_probability_bbox

In [9]:
obj_threshold = 0.015
boxes_tiny_threshold = find_high_class_probability_bbox(netout_scale,obj_threshold)
print("obj_threshold={}".format(obj_threshold))
print("In total, YOLO can produce GRID_H * GRID_W * BOX = {} bounding boxes ".format( GRID_H * GRID_W * BOX))
print("I found {} bounding boxes with top class probability > {}".format(len(boxes_tiny_threshold),obj_threshold))

obj_threshold = 0.03
boxes = find_high_class_probability_bbox(netout_scale,obj_threshold)
print("\nobj_threshold={}".format(obj_threshold))
print("In total, YOLO can produce GRID_H * GRID_W * BOX = {} bounding boxes ".format( GRID_H * GRID_W * BOX))
print("I found {} bounding boxes with top class probability > {}".format(len(boxes),obj_threshold))
obj_threshold=0.015
In total, YOLO can produce GRID_H * GRID_W * BOX = 676 bounding boxes 
I found 51 bounding boxes with top class probability > 0.015

obj_threshold=0.03
In total, YOLO can produce GRID_H * GRID_W * BOX = 676 bounding boxes 
I found 34 bounding boxes with top class probability > 0.03

Visualize many bounding boxes by using a small obj_threshold value

Most of the bounding boxes do not contain objects. This shows that we really need to reduce the number of bounding boxes.

In [10]:
import cv2, copy
def draw_boxes(image, boxes, labels, obj_baseline=0.05,verbose=False):
    '''
    image : np.array of shape (N height, N width, 3)
    '''
    def adjust_minmax(c,_max):
        if c < 0:
            c = 0   
        if c > _max:
            c = _max
        return c
    
    image = copy.deepcopy(image)
    image_h, image_w, _ = image.shape
    score_rescaled  = np.array([box.get_score() for box in boxes])
    score_rescaled /= obj_baseline
    for sr, box in zip(score_rescaled,boxes):
        xmin = adjust_minmax(int(box.xmin*image_w),image_w)
        ymin = adjust_minmax(int(box.ymin*image_h),image_h)
        xmax = adjust_minmax(int(box.xmax*image_w),image_w)
        ymax = adjust_minmax(int(box.ymax*image_h),image_h)
 
        
        text = "{:10} {:4.3f}".format(labels[box.label], box.get_score())
        if verbose:
            print("{} xmin={:4.0f},ymin={:4.0f},xmax={:4.0f},ymax={:4.0f}".format(text,xmin,ymin,xmax,ymax))
        cv2.rectangle(image, 
                      pt1=(xmin,ymin), 
                      pt2=(xmax,ymax), 
                      color=(0,1,0), 
                      thickness=max(1, int(sr)))  # cv2 expects an integer thickness
        cv2.putText(img       = image, 
                    text      = text, 
                    org       = (xmin+ 13, ymin + 13),
                    fontFace  = cv2.FONT_HERSHEY_SIMPLEX,
                    fontScale = 1e-3 * image_h,
                    color     = (1, 0, 1),
                    thickness = 1)
        
    return image


print("Plot with low object threshold")
ima = draw_boxes(X_test[0],boxes_tiny_threshold,LABELS,verbose=True)
figsize = (15,15)
plt.figure(figsize=figsize)
plt.imshow(ima); 
plt.title("Plot with low object threshold")
plt.show()

print("Plot with high object threshold")
ima = draw_boxes(X_test[0],boxes,LABELS,verbose=True)
figsize = (15,15)
plt.figure(figsize=figsize)
plt.imshow(ima); 
plt.title("Plot with high object threshold")
plt.show()
Plot with low object threshold
person     0.021 xmin= 118,ymin=  13,xmax= 238,ymax= 214
person     0.127 xmin= 176,ymin=  23,xmax= 303,ymax= 219
person     0.058 xmin= 136,ymin=   0,xmax= 342,ymax= 267
person     0.015 xmin= 202,ymin=  21,xmax= 325,ymax= 217
person     0.029 xmin=   0,ymin=   0,xmax= 153,ymax= 289
person     0.029 xmin=   0,ymin=   0,xmax= 182,ymax= 286
person     0.020 xmin=   0,ymin=   0,xmax= 209,ymax= 286
person     0.041 xmin=  85,ymin=  48,xmax= 206,ymax= 249
person     0.093 xmin= 117,ymin=  46,xmax= 235,ymax= 249
person     0.029 xmin=  70,ymin=   2,xmax= 279,ymax= 296
person     0.102 xmin= 158,ymin=  46,xmax= 273,ymax= 248
person     0.166 xmin= 116,ymin=   7,xmax= 320,ymax= 299
person     0.834 xmin= 178,ymin=  47,xmax= 302,ymax= 260
person     0.884 xmin= 143,ymin=  10,xmax= 340,ymax= 303
person     0.088 xmin= 200,ymin=  45,xmax= 320,ymax= 251
person     0.202 xmin= 158,ymin=   4,xmax= 361,ymax= 299
person     0.025 xmin= 244,ymin=  43,xmax= 363,ymax= 244
person     0.015 xmin= 198,ymin=   0,xmax= 406,ymax= 292
bicycle    0.030 xmin=  55,ymin=  75,xmax= 176,ymax= 276
bicycle    0.026 xmin=   3,ymin=  28,xmax= 215,ymax= 318
bicycle    0.294 xmin=  85,ymin=  83,xmax= 210,ymax= 278
bicycle    0.123 xmin=  42,ymin=  40,xmax= 250,ymax= 326
bicycle    0.363 xmin= 109,ymin=  82,xmax= 231,ymax= 281
bicycle    0.114 xmin=  74,ymin=  41,xmax= 279,ymax= 329
person     0.318 xmin= 158,ymin=  68,xmax= 272,ymax= 272
person     0.422 xmin= 118,ymin=  25,xmax= 321,ymax= 318
person     0.954 xmin= 185,ymin=  55,xmax= 304,ymax= 280
person     0.974 xmin= 146,ymin=  18,xmax= 341,ymax= 315
person     0.103 xmin= 203,ymin=  64,xmax= 319,ymax= 276
person     0.270 xmin= 158,ymin=  19,xmax= 359,ymax= 315
person     0.044 xmin= 244,ymin=  71,xmax= 361,ymax= 275
person     0.016 xmin= 196,ymin=  22,xmax= 401,ymax= 319
bicycle    0.110 xmin=  58,ymin= 106,xmax= 181,ymax= 305
bicycle    0.047 xmin=   5,ymin=  64,xmax= 220,ymax= 349
bicycle    0.257 xmin=  85,ymin= 112,xmax= 220,ymax= 304
bicycle    0.108 xmin=  50,ymin=  71,xmax= 255,ymax= 351
bicycle    0.548 xmin=  98,ymin= 107,xmax= 232,ymax= 303
bicycle    0.110 xmin=  67,ymin=  69,xmax= 269,ymax= 349
bicycle    0.054 xmin= 145,ymin=  94,xmax= 260,ymax= 299
bicycle    0.024 xmin= 104,ymin=  53,xmax= 310,ymax= 339
person     0.041 xmin= 184,ymin=  89,xmax= 302,ymax= 302
person     0.040 xmin= 140,ymin=  47,xmax= 341,ymax= 340
person     0.022 xmin= 244,ymin= 104,xmax= 363,ymax= 303
bicycle    0.033 xmin=  52,ymin= 138,xmax= 173,ymax= 331
bicycle    0.022 xmin=   6,ymin=  90,xmax= 218,ymax= 376
bicycle    0.044 xmin=  82,ymin= 135,xmax= 214,ymax= 326
bicycle    0.039 xmin=  48,ymin=  91,xmax= 256,ymax= 371
bicycle    0.031 xmin= 101,ymin= 131,xmax= 231,ymax= 323
bicycle    0.021 xmin=  67,ymin=  91,xmax= 272,ymax= 372
bicycle    0.018 xmin= 184,ymin= 134,xmax= 302,ymax= 330
bicycle    0.015 xmin=   6,ymin= 127,xmax= 217,ymax= 415
Plot with high object threshold
person     0.127 xmin= 176,ymin=  23,xmax= 303,ymax= 219
person     0.058 xmin= 136,ymin=   0,xmax= 342,ymax= 267
person     0.041 xmin=  85,ymin=  48,xmax= 206,ymax= 249
person     0.093 xmin= 117,ymin=  46,xmax= 235,ymax= 249
person     0.102 xmin= 158,ymin=  46,xmax= 273,ymax= 248
person     0.166 xmin= 116,ymin=   7,xmax= 320,ymax= 299
person     0.834 xmin= 178,ymin=  47,xmax= 302,ymax= 260
person     0.884 xmin= 143,ymin=  10,xmax= 340,ymax= 303
person     0.088 xmin= 200,ymin=  45,xmax= 320,ymax= 251
person     0.202 xmin= 158,ymin=   4,xmax= 361,ymax= 299
bicycle    0.294 xmin=  85,ymin=  83,xmax= 210,ymax= 278
bicycle    0.123 xmin=  42,ymin=  40,xmax= 250,ymax= 326
bicycle    0.363 xmin= 109,ymin=  82,xmax= 231,ymax= 281
bicycle    0.114 xmin=  74,ymin=  41,xmax= 279,ymax= 329
person     0.318 xmin= 158,ymin=  68,xmax= 272,ymax= 272
person     0.422 xmin= 118,ymin=  25,xmax= 321,ymax= 318
person     0.954 xmin= 185,ymin=  55,xmax= 304,ymax= 280
person     0.974 xmin= 146,ymin=  18,xmax= 341,ymax= 315
person     0.103 xmin= 203,ymin=  64,xmax= 319,ymax= 276
person     0.270 xmin= 158,ymin=  19,xmax= 359,ymax= 315
person     0.044 xmin= 244,ymin=  71,xmax= 361,ymax= 275
bicycle    0.110 xmin=  58,ymin= 106,xmax= 181,ymax= 305
bicycle    0.047 xmin=   5,ymin=  64,xmax= 220,ymax= 349
bicycle    0.257 xmin=  85,ymin= 112,xmax= 220,ymax= 304
bicycle    0.108 xmin=  50,ymin=  71,xmax= 255,ymax= 351
bicycle    0.548 xmin=  98,ymin= 107,xmax= 232,ymax= 303
bicycle    0.110 xmin=  67,ymin=  69,xmax= 269,ymax= 349
bicycle    0.054 xmin= 145,ymin=  94,xmax= 260,ymax= 299
person     0.041 xmin= 184,ymin=  89,xmax= 302,ymax= 302
person     0.040 xmin= 140,ymin=  47,xmax= 341,ymax= 340
bicycle    0.033 xmin=  52,ymin= 138,xmax= 173,ymax= 331
bicycle    0.044 xmin=  82,ymin= 135,xmax= 214,ymax= 326
bicycle    0.039 xmin=  48,ymin=  91,xmax= 256,ymax= 371
bicycle    0.031 xmin= 101,ymin= 131,xmax= 231,ymax= 323

Non-max suppression

Non-max suppression is a way to detect each object only once. Andrew Ng presents the idea well in his lecture C4W3L07, Non-max Suppression.

The following code implements the non-max suppression algorithm. For each object class, the algorithm picks the most promising bounding box and then removes (suppresses) the remaining boxes that overlap heavily with it. "Most promising" is determined by the predicted class probability.
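The overlap test below is delegated to bestAnchorBoxFinder.bbox_iou from the backend module. For reference, the intersection-over-union of two boxes in (xmin, ymin, xmax, ymax) form can be sketched as follows (a minimal standalone version, not necessarily identical to the backend implementation):

```python
def interval_overlap(a_min, a_max, b_min, b_max):
    # length of the overlap between two 1-D intervals
    return max(0.0, min(a_max, b_max) - max(a_min, b_min))

def bbox_iou(box1, box2):
    # each box is a tuple (xmin, ymin, xmax, ymax)
    iw = interval_overlap(box1[0], box1[2], box2[0], box2[2])
    ih = interval_overlap(box1[1], box1[3], box2[1], box2[3])
    intersect = iw * ih
    area1 = (box1[2] - box1[0]) * (box1[3] - box1[1])
    area2 = (box2[2] - box2[0]) * (box2[3] - box2[1])
    union = area1 + area2 - intersect
    return intersect / union

print(bbox_iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1 / 7 ≈ 0.143
```

An IOU of 1 means identical boxes and 0 means no overlap; the iou_threshold used below decides how much overlap is tolerated before a box is suppressed.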

In [11]:
from backend import BestAnchorBoxFinder
def nonmax_suppression(boxes,iou_threshold,obj_threshold):
    '''
    boxes : list containing "good" BoundBox of a frame
            [BoundBox(),BoundBox(),...]
    '''
    bestAnchorBoxFinder    = BestAnchorBoxFinder([])
    
    CLASS    = len(boxes[0].classes)
    index_boxes = []   
    # suppress non-maximal boxes
    for c in range(CLASS):
        # extract class probabilities of the c^th class from multiple bbox
        class_probability_from_bbxs = [box.classes[c] for box in boxes]

        #sorted_indices[i] contains the index of the i^th largest class probability
        sorted_indices = list(reversed(np.argsort( class_probability_from_bbxs)))

        for i in range(len(sorted_indices)):
            index_i = sorted_indices[i]
            
            # if class probability is zero then ignore
            if boxes[index_i].classes[c] == 0:  
                continue
            else:
                index_boxes.append(index_i)
                for j in range(i+1, len(sorted_indices)):
                    index_j = sorted_indices[j]
                    
                    # check if the selected i^th bounding box has high IOU with any of the remaining bbox
                    # if so, the remaining bbox' class probabilities are set to 0.
                    bbox_iou = bestAnchorBoxFinder.bbox_iou(boxes[index_i], boxes[index_j])
                    if bbox_iou >= iou_threshold:
                        classes = boxes[index_j].classes
                        classes[c] = 0
                        boxes[index_j].set_class(classes)
                        
    newboxes = [ boxes[i] for i in index_boxes if boxes[i].get_score() > obj_threshold ]                
    
    return newboxes 

Experiment with nonmax_suppression

In [12]:
iou_threshold = 0.01
final_boxes = nonmax_suppression(boxes,iou_threshold=iou_threshold,obj_threshold=obj_threshold)
print("{} final number of boxes".format(len(final_boxes)))
2 final number of boxes

Finally, draw the bounding boxes on the warped image

In [13]:
ima = draw_boxes(X_test[0],final_boxes,LABELS,verbose=True)
figsize = (15,15)
plt.figure(figsize=figsize)
plt.imshow(ima); 
plt.show()
bicycle    0.548 xmin=  98,ymin= 107,xmax= 232,ymax= 303
person     0.974 xmin= 146,ymin=  18,xmax= 341,ymax= 315

More examples

In [14]:
np.random.seed(1)
Nsample   = 20
image_nms = list(np.random.choice(os.listdir(train_image_folder),Nsample))
In [15]:
outputRescaler = OutputRescaler(ANCHORS=ANCHORS)
imageReader    = ImageReader(IMAGE_H,IMAGE_W=IMAGE_W, norm=lambda image : image / 255.)
X_test         = []
for img_nm in image_nms:
    _path    = os.path.join(train_image_folder,img_nm)
    out      = imageReader.fit(_path)
    X_test.append(out)

X_test = np.array(X_test)

## model
dummy_array    = np.zeros((len(X_test),1,1,1,TRUE_BOX_BUFFER,4))
y_pred         = model.predict([X_test,dummy_array])

for iframe in range(len(y_pred)):
        netout         = y_pred[iframe] 
        netout_scale   = outputRescaler.fit(netout)
        boxes          = find_high_class_probability_bbox(netout_scale,obj_threshold)
        if len(boxes) > 0:
            final_boxes    = nonmax_suppression(boxes,
                                                iou_threshold=iou_threshold,
                                                obj_threshold=obj_threshold)
            ima = draw_boxes(X_test[iframe],final_boxes,LABELS,verbose=True)
            plt.figure(figsize=figsize)
            plt.imshow(ima); 
            plt.show()
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
person     0.118 xmin=   0,ymin= 178,xmax= 103,ymax= 367
person     0.118 xmin=   0,ymin= 178,xmax= 103,ymax= 367
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
person     0.118 xmin=   0,ymin= 178,xmax= 103,ymax= 367
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
person     0.118 xmin=   0,ymin= 178,xmax= 103,ymax= 367
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
person     0.118 xmin=   0,ymin= 178,xmax= 103,ymax= 367
person     0.118 xmin=   0,ymin= 178,xmax= 103,ymax= 367
person     0.118 xmin=   0,ymin= 178,xmax= 103,ymax= 367
person     0.118 xmin=   0,ymin= 178,xmax= 103,ymax= 367
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
pottedplant 0.040 xmin= 254,ymin= 169,xmax= 367,ymax= 361
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
pottedplant 0.059 xmin=  82,ymin= 142,xmax= 203,ymax= 326
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.158 xmin=  61,ymin= 234,xmax= 166,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
person     0.037 xmin= 310,ymin= 237,xmax= 416,ymax= 416
chair      0.242 xmin=  59,ymin= 122,xmax= 416,ymax= 416
person     0.890 xmin=  61,ymin= 100,xmax= 416,ymax= 416
sofa       0.576 xmin=  29,ymin= 123,xmax= 406,ymax= 416
sofa       0.036 xmin= 195,ymin= 130,xmax= 410,ymax= 414
sofa       0.036 xmin= 195,ymin= 130,xmax= 410,ymax= 414
chair      0.032 xmin=  13,ymin= 153,xmax= 226,ymax= 416
sofa       0.036 xmin= 195,ymin= 130,xmax= 410,ymax= 414
person     0.503 xmin=   3,ymin=  32,xmax= 210,ymax= 330
sofa       0.036 xmin= 195,ymin= 130,xmax= 410,ymax= 414
cow        0.677 xmin=  30,ymin=  35,xmax= 394,ymax= 377
aeroplane  0.106 xmin= 130,ymin=   9,xmax= 344,ymax= 289
person     0.035 xmin=  47,ymin=  38,xmax= 415,ymax= 390
chair      0.058 xmin= 129,ymin=  91,xmax= 338,ymax= 384
chair      0.058 xmin= 129,ymin=  91,xmax= 338,ymax= 384
chair      0.058 xmin= 129,ymin=  91,xmax= 338,ymax= 384
chair      0.058 xmin= 129,ymin=  91,xmax= 338,ymax= 384
person     0.363 xmin=  72,ymin=   0,xmax= 272,ymax= 263
chair      0.058 xmin= 129,ymin=  91,xmax= 338,ymax= 384
chair      0.058 xmin= 129,ymin=  91,xmax= 338,ymax= 384
car        0.038 xmin=   0,ymin=   6,xmax= 183,ymax= 295
car        0.038 xmin=   0,ymin=   6,xmax= 183,ymax= 295
car        0.038 xmin=   0,ymin=   6,xmax= 183,ymax= 295
cat        0.723 xmin=  27,ymin=  61,xmax= 398,ymax= 416
sofa       0.041 xmin=  24,ymin= 120,xmax= 397,ymax= 416
sofa       0.041 xmin=  24,ymin= 120,xmax= 397,ymax= 416
sofa       0.041 xmin=  24,ymin= 120,xmax= 397,ymax= 416
dog        0.202 xmin=  49,ymin=  86,xmax= 416,ymax= 416
car        0.038 xmin=   0,ymin=   6,xmax= 183,ymax= 295
car        0.038 xmin=   0,ymin=   6,xmax= 183,ymax= 295
car        0.038 xmin=   0,ymin=   6,xmax= 183,ymax= 295
sofa       0.041 xmin=  24,ymin= 120,xmax= 397,ymax= 416
car        0.038 xmin=   0,ymin=   6,xmax= 183,ymax= 295
car        0.038 xmin=   0,ymin=   6,xmax= 183,ymax= 295
aeroplane  0.041 xmin=  21,ymin=   9,xmax= 396,ymax= 359
aeroplane  0.041 xmin=  21,ymin=   9,xmax= 396,ymax= 359
person     0.876 xmin=  19,ymin=  66,xmax= 394,ymax= 416
aeroplane  0.041 xmin=  21,ymin=   9,xmax= 396,ymax= 359
aeroplane  0.750 xmin=  22,ymin=   8,xmax= 390,ymax= 349
aeroplane  0.750 xmin=  22,ymin=   8,xmax= 390,ymax= 349
aeroplane  0.750 xmin=  22,ymin=   8,xmax= 390,ymax= 349
aeroplane  0.750 xmin=  22,ymin=   8,xmax= 390,ymax= 349
aeroplane  0.750 xmin=  22,ymin=   8,xmax= 390,ymax= 349
aeroplane  0.750 xmin=  22,ymin=   8,xmax= 390,ymax= 349
boat       0.286 xmin=  89,ymin= 133,xmax= 416,ymax= 416
person     0.040 xmin=  19,ymin=  70,xmax= 134,ymax= 271
person     0.033 xmin= 186,ymin= 169,xmax= 305,ymax= 363
chair      0.041 xmin= 226,ymin= 163,xmax= 416,ymax= 416
chair      0.041 xmin= 226,ymin= 163,xmax= 416,ymax= 416
dog        0.059 xmin= 110,ymin= 208,xmax= 238,ymax= 401
dog        0.059 xmin= 110,ymin= 208,xmax= 238,ymax= 401
dog        0.059 xmin= 110,ymin= 208,xmax= 238,ymax= 401
dog        0.059 xmin= 110,ymin= 208,xmax= 238,ymax= 401
dog        0.059 xmin= 110,ymin= 208,xmax= 238,ymax= 401
dog        0.059 xmin= 110,ymin= 208,xmax= 238,ymax= 401
chair      0.041 xmin= 226,ymin= 163,xmax= 416,ymax= 416
dog        0.059 xmin= 110,ymin= 208,xmax= 238,ymax= 401
dog        0.059 xmin= 110,ymin= 208,xmax= 238,ymax= 401
person     0.519 xmin= 106,ymin=  64,xmax= 309,ymax= 362
dog        0.059 xmin= 110,ymin= 208,xmax= 238,ymax= 401
sofa       0.081 xmin=  19,ymin=  60,xmax= 389,ymax= 416
dog        0.059 xmin= 110,ymin= 208,xmax= 238,ymax= 401
dog        0.046 xmin=  27,ymin=  38,xmax= 388,ymax= 390
person     0.035 xmin= 101,ymin=  69,xmax= 320,ymax= 361
person     0.035 xmin= 101,ymin=  69,xmax= 320,ymax= 361
dog        0.046 xmin=  27,ymin=  38,xmax= 388,ymax= 390
sheep      0.288 xmin=  99,ymin=  84,xmax= 320,ymax= 374
person     0.035 xmin= 101,ymin=  69,xmax= 320,ymax= 361
dog        0.046 xmin=  27,ymin=  38,xmax= 388,ymax= 390
dog        0.046 xmin=  27,ymin=  38,xmax= 388,ymax= 390
sheep      0.288 xmin=  99,ymin=  84,xmax= 320,ymax= 374
person     0.035 xmin= 101,ymin=  69,xmax= 320,ymax= 361
dog        0.046 xmin=  27,ymin=  38,xmax= 388,ymax= 390
sheep      0.288 xmin=  99,ymin=  84,xmax= 320,ymax= 374
person     0.035 xmin= 101,ymin=  69,xmax= 320,ymax= 361
person     0.035 xmin= 101,ymin=  69,xmax= 320,ymax= 361
dog        0.046 xmin=  27,ymin=  38,xmax= 388,ymax= 390
sheep      0.288 xmin=  99,ymin=  84,xmax= 320,ymax= 374
person     0.035 xmin= 101,ymin=  69,xmax= 320,ymax= 361
dog        0.046 xmin=  27,ymin=  38,xmax= 388,ymax= 390
dog        0.046 xmin=  27,ymin=  38,xmax= 388,ymax= 390
sofa       0.074 xmin=  27,ymin= 130,xmax= 400,ymax= 416
sofa       0.074 xmin=  27,ymin= 130,xmax= 400,ymax= 416
chair      0.033 xmin=   0,ymin= 222,xmax= 106,ymax= 416
chair      0.033 xmin=   0,ymin= 222,xmax= 106,ymax= 416
chair      0.033 xmin=   0,ymin= 222,xmax= 106,ymax= 416
chair      0.033 xmin=   0,ymin= 222,xmax= 106,ymax= 416
diningtable 0.069 xmin=  31,ymin= 164,xmax= 402,ymax= 416
chair      0.033 xmin=   0,ymin= 222,xmax= 106,ymax= 416
person     0.761 xmin=   0,ymin= 195,xmax= 109,ymax= 403
person     0.489 xmin= 109,ymin= 138,xmax= 223,ymax= 336
person     0.276 xmin= 305,ymin= 164,xmax= 416,ymax= 375
chair      0.033 xmin=   0,ymin= 222,xmax= 106,ymax= 416
chair      0.033 xmin=   0,ymin= 222,xmax= 106,ymax= 416
sofa       0.074 xmin=  27,ymin= 130,xmax= 400,ymax= 416
sofa       0.074 xmin=  27,ymin= 130,xmax= 400,ymax= 416
chair      0.033 xmin=   0,ymin= 222,xmax= 106,ymax= 416
chair      0.034 xmin=  74,ymin=   2,xmax= 278,ymax= 293
person     0.922 xmin=  65,ymin= 117,xmax= 277,ymax= 416
chair      0.034 xmin=  74,ymin=   2,xmax= 278,ymax= 293
chair      0.034 xmin=  74,ymin=   2,xmax= 278,ymax= 293
bicycle    0.030 xmin=  55,ymin=  25,xmax= 416,ymax= 388
bicycle    0.030 xmin=  55,ymin=  25,xmax= 416,ymax= 388
person     0.037 xmin=  57,ymin=   0,xmax= 416,ymax= 356
bicycle    0.030 xmin=  55,ymin=  25,xmax= 416,ymax= 388
bicycle    0.030 xmin=  55,ymin=  25,xmax= 416,ymax= 388
cat        0.068 xmin= 131,ymin= 161,xmax= 358,ymax= 416
chair      0.416 xmin= 227,ymin=   0,xmax= 416,ymax= 264
diningtable 0.040 xmin=  60,ymin= 190,xmax= 416,ymax= 416
person     0.037 xmin=  57,ymin=   0,xmax= 416,ymax= 356
sofa       0.033 xmin=  64,ymin= 159,xmax= 416,ymax= 416
person     0.037 xmin=  57,ymin=   0,xmax= 416,ymax= 356
aeroplane  0.496 xmin=   6,ymin=  48,xmax= 220,ymax= 321
aeroplane  0.051 xmin= 225,ymin=  77,xmax= 416,ymax= 348
aeroplane  0.051 xmin= 225,ymin=  77,xmax= 416,ymax= 348
aeroplane  0.051 xmin= 225,ymin=  77,xmax= 416,ymax= 348
aeroplane  0.051 xmin= 225,ymin=  77,xmax= 416,ymax= 348
aeroplane  0.051 xmin= 225,ymin=  77,xmax= 416,ymax= 348
aeroplane  0.051 xmin= 225,ymin=  77,xmax= 416,ymax= 348
aeroplane  0.496 xmin=   6,ymin=  48,xmax= 220,ymax= 321
aeroplane  0.051 xmin= 225,ymin=  77,xmax= 416,ymax= 348
aeroplane  0.051 xmin= 225,ymin=  77,xmax= 416,ymax= 348
aeroplane  0.051 xmin= 225,ymin=  77,xmax= 416,ymax= 348
aeroplane  0.496 xmin=   6,ymin=  48,xmax= 220,ymax= 321
aeroplane  0.051 xmin= 225,ymin=  77,xmax= 416,ymax= 348
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
boat       0.090 xmin=  51,ymin=  32,xmax= 416,ymax= 385
person     0.671 xmin= 231,ymin= 243,xmax= 334,ymax= 416
horse      0.900 xmin= 101,ymin= 165,xmax= 303,ymax= 416
person     0.808 xmin= 134,ymin= 114,xmax= 238,ymax= 314

FairyOnIce/ObjectDetectionYolo contains this ipython notebook and all the functions that I defined in this notebook.
